What is Markov's inequality?

Markov's inequality is a fundamental result in probability theory that provides an upper bound on the probability that a non-negative random variable takes a value greater than or equal to a given positive constant.

Formally, let X be a non-negative random variable and let a > 0 be a positive constant. Then Markov's inequality states that:

P(X >= a) <= E(X)/a

where E(X) denotes the expected value of X. In other words, the probability that X takes on a value greater than or equal to a is bounded above by the ratio of the expected value of X to the constant a.
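To see the inequality in action, here is a minimal simulation sketch in Python. The distribution (an exponential with mean 2) and the threshold a = 5 are arbitrary choices for illustration, not part of the statement above; the point is that the empirical tail probability stays below E(X)/a.

```python
import numpy as np

# Illustrative check of Markov's inequality for an exponential random variable.
# The mean and threshold below are arbitrary example values.
rng = np.random.default_rng(0)

mean = 2.0   # E(X) for an exponential distribution with scale = 2
a = 5.0      # threshold; must be positive
samples = rng.exponential(scale=mean, size=1_000_000)

empirical_tail = np.mean(samples >= a)   # estimate of P(X >= a)
markov_bound = mean / a                  # E(X) / a

print(f"P(X >= {a}) is approximately {empirical_tail:.4f}")
print(f"Markov bound E(X)/a = {markov_bound:.4f}")
# The empirical tail probability should not exceed the Markov bound.
```

For this example the true tail probability is exp(-5/2), roughly 0.08, comfortably below the bound of 0.4; Markov's inequality is valid for any non-negative X but is often loose.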

Markov's inequality is often used to establish upper bounds on tail probabilities of random variables and is a useful tool in probabilistic analysis and bounding techniques. It is especially useful when exact probabilities are difficult or impossible to compute, since it yields simple, tractable bounds from the expected value alone.
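As a small sketch of that use case, the helper below bounds a tail probability using only the expected value. The function name, the server-load scenario, and the numbers are hypothetical, chosen purely to illustrate how the bound is applied when the full distribution is unknown.

```python
def markov_tail_bound(expected_value: float, a: float) -> float:
    """Upper bound on P(X >= a) for a non-negative random variable X,
    using only its expectation E(X). (Hypothetical helper for illustration.)"""
    if a <= 0:
        raise ValueError("threshold a must be positive")
    if expected_value < 0:
        raise ValueError("E(X) must be non-negative for Markov's inequality")
    # Markov's inequality: P(X >= a) <= E(X) / a; a probability never exceeds 1.
    return min(1.0, expected_value / a)


# Example (assumed figures): a server receives 50 requests per second on average.
# Without knowing the distribution, the chance of 500 or more requests in a
# given second is at most 50/500 = 0.1.
print(markov_tail_bound(expected_value=50, a=500))  # prints 0.1
```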